Regularization and stability in reservoir networks with output feedback

Authors

  • René Felix Reinhart
  • Jochen J. Steil
Abstract

Output feedback is crucial for autonomous and parameterized pattern generation with reservoir networks. Read-out learning affects the output feedback loop and can lead to error amplification. Regularization is therefore important both for generalization and for the reduction of error amplification. We show that regularization of the reservoir and the read-out layer reduces the risk of error amplification, mitigates parameter dependency, and boosts the task-specific performance of reservoir networks with output feedback. We discuss the deeper connection between regularization of the learning process and stability of the trained network.
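The abstract's core idea can be made concrete with a small sketch: an echo state network with output feedback, trained by teacher forcing with a ridge-regularized read-out. This is an illustrative reconstruction, not the paper's exact setup; all sizes, parameter values, and the ridge penalty `lam` are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
N, T, washout = 100, 500, 50

W = rng.normal(size=(N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # scale spectral radius below 1
w_fb = rng.uniform(-1.0, 1.0, size=N)            # output-feedback weights

target = np.sin(0.1 * np.arange(T))              # pattern to generate

# Teacher forcing: feed the target back instead of the network's own output.
x = np.zeros(N)
X = np.zeros((T, N))
for t in range(T):
    fb = target[t - 1] if t > 0 else 0.0
    x = np.tanh(W @ x + w_fb * fb)
    X[t] = x

# Ridge-regularized read-out: the penalty lam keeps the output weights small,
# which limits how strongly read-out errors are amplified through the loop.
lam = 1e-2
A = X[washout:].T @ X[washout:] + lam * np.eye(N)
w_out = np.linalg.solve(A, X[washout:].T @ target[washout:])

# Autonomous run: the trained output now drives the feedback itself.
x, y, ys = np.zeros(N), 0.0, []
for _ in range(200):
    x = np.tanh(W @ x + w_fb * y)
    y = w_out @ x
    ys.append(y)
```

With `lam = 0` the least-squares read-out can be large in norm, so small state perturbations are amplified on every pass through the feedback loop; the ridge penalty trades a little training error for a bounded loop gain, which is the regularization-stability connection the paper examines.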


Related articles

Reservoir regularization stabilizes learning of Echo State Networks with output feedback

Output feedback is crucial for autonomous and parameterized pattern generation with reservoir networks. Read-out learning can lead to error amplification in these settings and therefore regularization is important for both generalization and reduction of error amplification. We show that regularization of the inner reservoir network mitigates parameter dependencies and boosts the task-specific ...

Full text

Balancing of neural contributions for multi-modal hidden state association

We generalize the formulation of associative reservoir computing networks to multiple input modalities and demonstrate applications in image and audio processing scenarios. Robust association with reservoir networks requires to cope with potential error amplification of output feedback dynamics and to handle differently sized input and output modalities. We propose a dendritic neuron model in c...

Full text

Multi-directional continuous association with input-driven neural dynamics

We present an input-driven dynamical system approach to continuous association. Previous formulations of associative reservoir computing networks and associative extreme learning machines are unified and generalized to multiple modalities. Association in these networks proceeds by externally driving parts of the network. Through continuous variation of driving inputs, a continuous association o...

Full text

Saturated Neural Adaptive Robust Output Feedback Control of Robot Manipulators: An Experimental Comparative Study

In this study, an observer-based tracking controller is proposed and evaluated experimentally to solve the trajectory tracking problem of robotic manipulators with torque saturation in the presence of model uncertainties and external disturbances. In comparison with the state-of-the-art observer-based controllers in the literature, this paper introduces a saturated observer-based controller bas...

Full text

Regularization by Intrinsic Plasticity and Its Synergies with Recurrence for Random Projection Methods

Neural networks based on high-dimensional random feature generation have become popular under the notions extreme learning machine (ELM) and reservoir computing (RC). We provide an in-depth analysis of such networks with respect to feature selection, model complexity, and regularization. Starting from an ELM, we show how recurrent connections increase the effective complexity leading to reservo...

Full text


Journal:
  • Neurocomputing

Volume 90, Issue 

Pages  -

Publication year 2012